Spaces with Lusternik–Schnirelmann category n and cone length n+1
Authors
Abstract
Similar articles
Some Results on TVS-cone Normed Spaces and Algebraic Cone Metric Spaces
In this paper we introduce cone bounded linear mappings and prove that the cone norm is continuous. Among other things, we prove the open mapping theorem and the closed graph theorem in TVS-cone normed spaces. We also show that, under some restrictions on the cone, two cone norms are equivalent if and only if the topologies they induce are the same. In the sequel, we...
On the category of geometric spaces and the category of (geometric) hypergroups
In this paper we first define morphisms between geometric spaces in two different ways. We then construct two categories, $uu$ and $l$, from geometric spaces and investigate some of their properties; for instance, $uu$ is topological. The relation between hypergroups and geometric spaces is studied. By constructing the category $qh$ of $H_{v}$-groups we answer the question...
Cone normed spaces
In this paper, we introduce cone normed spaces and cone bounded linear mappings. Among other things, we prove the Baire category theorem and the Banach--Steinhaus theorem in cone normed spaces.
Variable-length category-based n-grams for language modelling
This report concerns the theoretical development and subsequent evaluation of n-gram language models based on word categories. In particular, part-of-speech word classifications have been employed as a means of incorporating significant amounts of a priori grammatical information into the model. The use of categories diminishes the problem of data sparseness which plagues conventional w...
A variable-length category-based n-gram language model
A language model based on word-category n-grams and ambiguous category membership, with n increased selectively to trade compactness for performance, is presented. The use of categories leads intrinsically to a compact model with the ability to generalise to unseen word sequences, and it diminishes the sparseness of the training data, thereby making larger n feasible. The language model implicitly ...
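The idea behind category-based n-grams can be illustrated with a toy bigram sketch: word-to-word probabilities are factored through categories, so the model assigns mass to word pairs never seen in training. Everything below (the word-to-category map, the corpus, the `prob` helper) is a hypothetical illustration, not the papers' actual model.

```python
# Minimal sketch of a category-based bigram language model.
# P(word | prev_word) is approximated by
#   P(cat(word) | cat(prev_word)) * P(word | cat(word)).
from collections import defaultdict

# Hand-made word -> part-of-speech map (illustrative only; the cited
# work derives categories from tagged corpora).
word_to_cat = {
    "the": "DET", "a": "DET",
    "cat": "NOUN", "dog": "NOUN",
    "runs": "VERB", "sleeps": "VERB",
}

corpus = ["the cat runs", "a dog sleeps", "the dog runs"]

cat_bigrams = defaultdict(int)  # counts of (prev_cat, cat)
cat_counts = defaultdict(int)   # counts of each category
emissions = defaultdict(int)    # counts of (cat, word)

for sentence in corpus:
    words = sentence.split()
    cats = [word_to_cat[w] for w in words]
    for w, c in zip(words, cats):
        emissions[(c, w)] += 1
        cat_counts[c] += 1
    for prev, cur in zip(cats, cats[1:]):
        cat_bigrams[(prev, cur)] += 1

def prob(prev_word, word):
    """Category-factored bigram probability estimate."""
    pc, c = word_to_cat[prev_word], word_to_cat[word]
    total_from_pc = sum(n for (p, _), n in cat_bigrams.items() if p == pc)
    trans = cat_bigrams[(pc, c)] / total_from_pc
    emit = emissions[(c, word)] / cat_counts[c]
    return trans * emit

# "a cat" never occurs in the corpus, yet it receives nonzero mass,
# because DET -> NOUN transitions and the emission "cat" are both seen.
print(prob("a", "cat"))
```

This is the generalisation property the abstract describes: a conventional word bigram model would assign the unseen pair "a cat" zero probability without smoothing, while the category factoring shares statistics across all determiner-noun pairs.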
Journal
Journal title: Topology
سال: 2000
ISSN: 0040-9383
DOI: 10.1016/s0040-9383(99)00047-6